Generalized co-sparse factor regression

Authors

Abstract

Multivariate regression techniques are commonly applied to explore the associations between large numbers of outcomes and predictors. In real-world applications, the outcomes are often of mixed types, including continuous measurements, binary indicators, and counts, and the observations may also be incomplete. Building upon recent advances in mixed-outcome modeling and sparse matrix factorization, generalized co-sparse factor regression (GOFAR) is proposed, which utilizes the flexible vector generalized linear model framework and encodes the outcome dependency through a sparse singular value decomposition (SSVD) of the integrated natural parameter matrix. To avoid the notoriously difficult joint SSVD estimation, GOFAR proposes both sequential and parallel unit-rank estimation procedures. By combining the ideas of alternating convex search and majorization–minimization, an efficient algorithm is developed to solve the problem; it is implemented in the R package gofar. Extensive simulation studies and two applications demonstrate the effectiveness of the proposed approach.
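To make the model concrete, the following R sketch simulates mixed-type outcomes from a co-sparse, low-rank natural parameter matrix of the kind GOFAR estimates. It is an illustrative example under assumed dimensions, sparsity patterns, singular values, and outcome-family assignments; it is not code from the paper or from the gofar package.

## Co-sparse factor regression structure: Theta = X C with C = U D V',
## where U and V have sparse columns; each outcome column follows its own
## exponential-family distribution with Theta as the natural parameter.
set.seed(1)
n <- 100; p <- 20; q <- 9; r <- 2            # samples, predictors, outcomes, rank

X <- matrix(rnorm(n * p), n, p)

U <- matrix(0, p, r); U[1:4, 1] <- 0.5; U[5:8, 2] <- 0.5              # sparse left singular vectors
V <- matrix(0, q, r); V[1:3, 1] <- 1/sqrt(3); V[4:6, 2] <- 1/sqrt(3)  # sparse right singular vectors
D <- diag(c(2, 1))                                                    # singular values
C <- U %*% D %*% t(V)                        # co-sparse, rank-2 coefficient matrix

Theta <- X %*% C                             # integrated natural parameter matrix

## Mixed outcome types: Gaussian (cols 1-3), Bernoulli/logit (4-6), Poisson/log (7-9)
Y <- matrix(NA, n, q)
Y[, 1:3] <- Theta[, 1:3] + rnorm(n * 3, sd = 0.5)
Y[, 4:6] <- rbinom(n * 3, 1, plogis(Theta[, 4:6]))
Y[, 7:9] <- rpois(n * 3, exp(Theta[, 7:9]))

## Incomplete observations, which GOFAR is designed to accommodate
Y[sample(length(Y), round(0.05 * length(Y)))] <- NA

Fitting such data would then use the sequential or parallel unit-rank routines of the gofar package; the exact function names and arguments are not reproduced here and should be taken from the package documentation.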


Similar articles

Robust Estimation in Linear Regression with Multicollinearity and Sparse Models

One of the factors affecting the statistical analysis of the data is the presence of outliers. The methods which are not affected by the outliers are called robust methods. Robust regression methods are robust estimation methods of regression model parameters in the presence of outliers. Besides outliers, the linear dependency of regressor variables, which is called multicollinearity...


Generalized Ridge Regression Estimator in Semiparametric Regression Models

In the context of ridge regression, the estimation of the ridge (shrinkage) parameter plays an important role in analyzing data. Much effort has been put into developing methods of computing shrinkage estimators for different full-parametric ridge regression approaches, using eigenvalues. However, the estimation of the shrinkage parameter has been neglected for semiparametric regression models. The m...


Sparse Regression Codes

Developing computationally-efficient codes that approach the Shannon-theoretic limits for communication and compression has long been one of the major goals of information and coding theory. There have been significant advances towards this goal in the last couple of decades, with the emergence of turbo and sparse-graph codes in the ‘90s [1, 2], and more recently polar codes and spatially-coupl...


Conditional Sparse Linear Regression

Machine learning and statistics typically focus on building models that capture the vast majority of the data, possibly ignoring a small subset of data as “noise” or “outliers.” By contrast, here we consider the problem of jointly identifying a significant (but perhaps small) segment of a population in which there is a highly sparse linear regression fit, together with the coefficients for the ...


Globally sparse PLS regression

Partial least squares (PLS) regression combines dimensionality reduction and prediction using a latent variable model. It provides better predictive ability than principal component analysis by taking into account both the independent and response variables in the dimension reduction procedure. However, PLS suffers from over-fitting when there are few samples but many variables. We formulate a ne...
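For context, a minimal sketch of ordinary PLS regression in R follows; it assumes the pls package and its bundled yarn data are available, and it fits only the standard latent-variable model, not the globally sparse variant proposed in that paper.

## Ordinary PLS regression: latent components are built using both the
## predictors (NIR spectra) and the response (yarn density)
library(pls)
data(yarn)
fit <- plsr(density ~ NIR, ncomp = 6, data = yarn, validation = "CV")
summary(fit)        # explained variance and cross-validated RMSEP per component
plot(RMSEP(fit))    # pick the number of latent components from the CV error curve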



Journal

Journal title: Computational Statistics & Data Analysis

Year: 2021

ISSN: 0167-9473, 1872-7352

DOI: https://doi.org/10.1016/j.csda.2020.107127